Spectral convergence of the connection Laplacian from random samples
Authors
Abstract
Spectral methods that are based on eigenvectors and eigenvalues of discrete graph Laplacians, such as Diffusion Maps and Laplacian Eigenmaps, are often used for manifold learning and nonlinear dimensionality reduction. It was previously shown by Belkin & Niyogi (2007, Convergence of Laplacian eigenmaps, vol. 19, Proceedings of the 2006 Conference on Advances in Neural Information Processing Systems, The MIT Press, p. 129) that the eigenvectors and eigenvalues of the graph Laplacian converge to the eigenfunctions and eigenvalues of the Laplace–Beltrami operator of the manifold in the limit of infinitely many data points sampled independently from the uniform distribution over the manifold. Recently, we introduced Vector Diffusion Maps and showed that the connection Laplacian of the tangent bundle of the manifold can be approximated from random samples. In this article, we present a unified framework for approximating other connection Laplacians over the manifold by considering its principal bundle structure. We prove that the eigenvectors and eigenvalues of these Laplacians converge in the limit of infinitely many independent random samples. We generalize the spectral convergence results to the case where the data points are sampled from a non-uniform distribution, and to manifolds with or without boundary.
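To make the construction the abstract alludes to concrete, here is a minimal sketch (not the authors' code) of the graph-Laplacian pipeline: points are drawn non-uniformly from the unit circle, a Gaussian kernel matrix is formed, the density normalization used in Diffusion Maps is applied, and the leading eigenvalues are compared with the Laplace–Beltrami spectrum. The bandwidth eps, the sample size n and the sampling density are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, eps = 1000, 0.01

# Non-uniform samples on the unit circle: the half-circle [0, pi) is
# oversampled, so the density is bounded away from zero but not uniform.
u = rng.uniform(0.0, 2.0 * np.pi, size=n)
theta = np.where(rng.random(n) < 0.5, 0.5 * u, u)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# Gaussian kernel matrix W_ij = exp(-|x_i - x_j|^2 / eps).
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-D2 / eps)

# Density normalization (alpha = 1 in Diffusion Maps): dividing by the kernel
# density estimate removes the effect of the non-uniform sampling, so the
# limiting operator is the Laplace-Beltrami operator itself.
q = W.sum(axis=1)
W1 = W / np.outer(q, q)

# Symmetric matrix similar to the Markov diffusion operator D^{-1} W1.
d = W1.sum(axis=1)
S = W1 / np.sqrt(np.outer(d, d))
evals = np.sort(np.linalg.eigvalsh(S))[::-1]

# Up to a kernel-dependent scaling constant, these should approximate the
# Laplace-Beltrami eigenvalues of the unit circle: 0, 1, 1, 4, 4, ...
print((1.0 - evals[:7]) / eps)
```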
Similar references
On the Convergence of Spectral Clustering on Random Samples: The Normalized Case
Given a set of n randomly drawn sample points, spectral clustering in its simplest form uses the second eigenvector of the graph Laplacian matrix, constructed on the similarity graph between the sample points, to obtain a partition of the sample. We are interested in the question of how spectral clustering behaves for growing sample size n. In case one uses the normalized graph Laplacian, we show ...
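A minimal sketch of the procedure this excerpt describes, assuming synthetic two-cluster data and an illustrative kernel bandwidth sigma: build the similarity graph, form the symmetric normalized graph Laplacian, and split the sample by the sign of its second eigenvector.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 200, 1.0

# Two well-separated Gaussian blobs in R^2 (n points each).
X = np.vstack([rng.normal(0.0, 0.3, size=(n, 2)),
               rng.normal(3.0, 0.3, size=(n, 2))])

# Similarity graph with Gaussian weights.
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-D2 / (2.0 * sigma ** 2))

# Symmetric normalized Laplacian L_sym = I - D^{-1/2} W D^{-1/2}.
deg = W.sum(axis=1)
L_sym = np.eye(2 * n) - W / np.sqrt(np.outer(deg, deg))

# The eigenvector of the second-smallest eigenvalue separates the clusters;
# thresholding its sign gives the partition.
evals, evecs = np.linalg.eigh(L_sym)
labels = (evecs[:, 1] > 0).astype(int)
print(labels[:5], labels[-5:])   # the two blobs receive different labels
```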
Convergence of Laplacian spectra from random samples
Eigenvectors and eigenvalues of discrete graph Laplacians are often used for manifold learning and nonlinear dimensionality reduction. It was previously proved by Belkin and Niyogi [3] that the eigenvectors and eigenvalues of the graph Laplacian converge to the eigenfunctions and eigenvalues of the Laplace-Beltrami operator of the manifold in the limit of infinitely many data points sampled ind...
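To illustrate the eigenvector half of this statement, the following sketch (illustrative parameters, uniform samples on the unit circle) checks that the first non-trivial eigenvector of a Gaussian-kernel graph Laplacian lies close to the span of the Laplace–Beltrami eigenfunctions cos(theta) and sin(theta).

```python
import numpy as np

rng = np.random.default_rng(4)
n, eps = 1000, 0.01

# Uniform samples from the unit circle embedded in R^2.
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# Gaussian kernel and its symmetric normalization.
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-D2 / eps)
d = W.sum(axis=1)
S = W / np.sqrt(np.outer(d, d))

evals, evecs = np.linalg.eigh(S)
v = evecs[:, -2]                 # first non-trivial eigenvector

# Project v onto span{cos(theta), sin(theta)}; because the corresponding
# eigenvalue is doubly degenerate, v is some mixture of the two, so the
# relative residual of this projection should be small.
B = np.column_stack([np.cos(theta), np.sin(theta)])
coef, *_ = np.linalg.lstsq(B, v, rcond=None)
resid = v - B @ coef
print(np.linalg.norm(resid) / np.linalg.norm(v))
```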
Signless Laplacian spectral moments of graphs and ordering some graphs with respect to them
Let $G = (V, E)$ be a simple graph. Denote by $D(G)$ the diagonal matrix $\mathrm{diag}(d_1,\cdots,d_n)$, where $d_i$ is the degree of vertex $i$, and by $A(G)$ the adjacency matrix of $G$. The signless Laplacian matrix of $G$ is $Q(G) = D(G) + A(G)$ and the $k$-th signless Laplacian spectral moment of the graph $G$ is defined as $T_k(G)=\sum_{i=1}^{n}q_i^{k}$, $k\geqslant 0$, where $q_1$, $q_2$, $\cdots$, $q_n$ ...
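A small worked example of these definitions, using the path graph on four vertices as an illustrative choice:

```python
import numpy as np

# Adjacency matrix of the path graph on 4 vertices: 1 - 2 - 3 - 4.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))     # diagonal matrix of vertex degrees
Q = D + A                      # signless Laplacian Q(G) = D(G) + A(G)

q = np.linalg.eigvalsh(Q)      # signless Laplacian eigenvalues q_1, ..., q_n

# Spectral moments T_k(G) = sum_i q_i^k for k = 0, 1, 2:
# T_0 = n, T_1 = trace(Q) = sum of degrees, T_2 = trace(Q^2).
for k in range(3):
    print(k, np.sum(q ** k))
```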
Convergence of Laplacian Eigenmaps
Geometrically based methods for various tasks of data analysis have attracted considerable attention over the last few years. In many of these algorithms, a central role is played by the eigenvectors of the graph Laplacian of a data-derived graph. In this paper, we show that if points are sampled uniformly at random from an unknown submanifold $M$ of $\mathbb{R}^N$, then the eigenvectors of a suitably const...
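A minimal sketch of a Laplacian Eigenmaps embedding, assuming a k-nearest-neighbour graph with heat-kernel weights (illustrative parameters, not necessarily the exact construction analysed in the paper):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n, k, t = 500, 10, 0.05

# Points near a circle, a one-dimensional submanifold of R^3.
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X = np.column_stack([np.cos(theta), np.sin(theta), 0.02 * rng.normal(size=n)])

# Symmetric k-nearest-neighbour graph with heat-kernel weights.
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.zeros((n, n))
for i in range(n):
    nbrs = np.argsort(D2[i])[1:k + 1]          # skip the point itself
    W[i, nbrs] = np.exp(-D2[i, nbrs] / t)
W = np.maximum(W, W.T)

# Laplacian Eigenmaps solves the generalized problem L f = lambda D f with
# L = D - W, and uses the eigenvectors of the smallest non-zero eigenvalues
# as embedding coordinates.
deg = W.sum(axis=1)
L = np.diag(deg) - W
evals, evecs = eigh(L, np.diag(deg))
Y = evecs[:, 1:3]      # skip the constant eigenvector; 2-D embedding
print(Y[:3])           # approximately a circle, up to rotation and scale
```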
Efficient Sampling for Gaussian Graphical Models via Spectral Sparsification
Motivated by a sampling problem basic to computational statistical inference, we develop a toolset based on spectral sparsification for a family of fundamental problems involving Gaussian sampling, matrix functionals, and reversible Markov chains. Drawing on the connection between Gaussian graphical models and the recent breakthroughs in spectral graph theory, we give the first nearly linear ti...
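For context, the baseline task this excerpt addresses can be sketched as follows: to sample from a Gaussian graphical model, factor the precision matrix (here an illustrative regularized graph Laplacian) with a dense Cholesky decomposition and solve against a standard normal vector. The excerpt's toolset, by contrast, is built on spectral sparsification.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100

# Precision matrix of the graphical model: graph Laplacian of the n-cycle
# plus a small ridge term to make it positive definite (illustrative choice).
A = np.zeros((n, n))
idx = np.arange(n)
A[idx, (idx + 1) % n] = 1.0
A = A + A.T
Prec = np.diag(A.sum(axis=1)) - A + 0.1 * np.eye(n)

# If Prec = C C^T (Cholesky), then x = C^{-T} z with z ~ N(0, I) has
# covariance Prec^{-1}, i.e. x is one sample from the model.
C = np.linalg.cholesky(Prec)
z = rng.normal(size=n)
x = np.linalg.solve(C.T, z)
print(x[:5])
```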